On-Line Learning with Restricted Training Sets: An Exactly Solvable Case

Authors

  • H. C. Rae
  • P. Sollich
Abstract

We solve the dynamics of on-line Hebbian learning in large perceptrons exactly, for the regime where the size of the training set scales linearly with the number of inputs. We consider both noiseless and noisy teachers. Our calculation cannot be extended to non-Hebbian rules, but the solution provides a convenient and welcome benchmark with which to test more general and advanced theories for solving the dynamics of learning with restricted training sets.
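
As a rough, purely illustrative sketch of the setting described in the abstract (not the paper's exact analytic solution), the Python snippet below simulates on-line Hebbian learning in a perceptron when examples are recycled from a fixed training set whose size scales linearly with the number of inputs, p = alpha*N, for a noiseless or noisy teacher. The parameter names and values (alpha, eta, noise_level) are assumptions made for this sketch.

```python
import numpy as np

# Illustrative simulation of on-line Hebbian learning with a restricted
# training set (p = alpha * N examples, recycled at random). This is only
# a sketch of the setting; it is not the paper's exact solution.

rng = np.random.default_rng(0)

N = 500              # number of inputs
alpha = 2.0          # restricted training set: p = alpha * N examples
eta = 0.1            # learning rate (assumed value)
noise_level = 0.0    # probability of flipping the teacher's output (noisy teacher)
steps = 20 * N       # number of on-line updates

B = rng.standard_normal(N)        # teacher weight vector
B /= np.linalg.norm(B)
J = np.zeros(N)                   # student weights, Hebbian student starts at zero

p = int(alpha * N)
xi = rng.standard_normal((p, N))  # fixed training inputs
labels = np.sign(xi @ B)          # teacher outputs
flip = rng.random(p) < noise_level
labels[flip] *= -1                # output noise, if any

for t in range(steps):
    mu = rng.integers(p)                   # recycle examples from the fixed set
    J += (eta / N) * labels[mu] * xi[mu]   # Hebbian update

# Generalization error for a spherical teacher/student pair:
# eps_g = arccos(J.B / (|J||B|)) / pi
overlap = J @ B / (np.linalg.norm(J) * np.linalg.norm(B))
print("generalization error:", np.arccos(np.clip(overlap, -1.0, 1.0)) / np.pi)
```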

Similar articles

Dynamics of on-line Hebbian learning with structurally unrealizable restricted training sets

We present an exact solution for the dynamics of on-line Hebbian learning in neural networks, with restricted and unrealizable training sets. In contrast to other studies on learning with restricted training sets, unrealizability is here caused by structural mismatch, rather than data noise: the teacher machine is a perceptron with a reversed wedge-type transfer function, while the student mach...
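
For concreteness, one common definition of a reversed-wedge transfer function is T_gamma(h) = sgn(h(h - gamma)(h + gamma)), which inverts the sign of the local field inside the wedge |h| < gamma; the small sketch below (with a hypothetical wedge width gamma, not taken from the paper) only illustrates why a plain sign-perceptron student cannot realize such a teacher.

```python
import numpy as np

# Hedged illustration of a reversed-wedge transfer function,
# T_gamma(h) = sign(h * (h - gamma) * (h + gamma)).
# The width gamma and the function names are assumptions of this sketch.

def reversed_wedge(h, gamma=1.0):
    return np.sign(h * (h - gamma) * (h + gamma))

h = np.linspace(-2.0, 2.0, 9)
student = np.sign(h)           # ordinary sign perceptron
teacher = reversed_wedge(h)    # reversed-wedge teacher
# The two disagree whenever 0 < |h| < gamma: this structural mismatch,
# rather than data noise, is what makes the task unrealizable.
print(np.column_stack([h, student, teacher]))
```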

On-Line Learning with Restricted Training Sets: Exact Solution as Benchmark for General Theories

We solve the dynamics of on-line Hebbian learning in perceptrons exactly, for the regime where the size of the training set scales linearly with the number of inputs. We consider both noiseless and noisy teachers. Our calculation cannot be extended to non-Hebbian rules, but the solution provides a nice benchmark to test more general and advanced theories for solving the dynamics of learning wit...

Dynamics of Supervised Learning with Restricted Training Sets and Noisy Teachers

We generalize a recent formalism to describe the dynamics of supervised learning in layered neural networks, in the regime where data recycling is inevitable, to the case of noisy teachers. Our theory generates predictions for the evolution in time of training and generalization errors, and extends the class of mathematically solvable learning processes in large neural networks to those complica...

On Methods to Keep Learning Away from Intractability

We investigate the complexity of learning from restricted sets of training examples. With the intention of making learning easier, we introduce two types of restrictions that describe the permitted training examples. The strength of the restrictions can be tuned by choosing specific parameters. We ask how strictly their values must be limited to turn NP-complete learning problems into polynomial-ti...

Optimal Self-healing of Smart Distribution Grids Based on Spanning Trees to Improve System Reliability

In this paper, a self-healing approach for smart distribution networks is presented, based on graph theory and cut sets. In the proposed graph-theory-based approach, the upstream grid and all the existing microgrids are modeled as a common node after fault occurrence. Thereafter, the maneuvering lines that are in the cut sets are selected as the recovery path for the alternative networks by making ...
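
Purely as a toy illustration of the graph-theoretic idea outlined in this abstract (merging the upstream grid and microgrids into one source node after a fault, then restoring supply over a spanning tree of the remaining network), and not the paper's actual algorithm, a minimal sketch using networkx might look as follows; all node and line names are invented.

```python
import networkx as nx

# Toy distribution network: nodes, tie lines and the "faulted" section
# are all made up for illustration; this is not the paper's method.
G = nx.Graph()
G.add_edges_from([
    ("grid", "b1"), ("b1", "b2"), ("b2", "b3"),   # main feeder
    ("mg1", "b3"),                                # microgrid connection
    ("b1", "b4"), ("b4", "b3"),                   # maneuvering (tie) lines
])

# After a fault, treat the upstream grid and the microgrid as one common
# source node by contracting them into a single supernode.
H = nx.contracted_nodes(G, "grid", "mg1", self_loops=False)

# Isolate the faulted line (here b2-b3) and take a spanning tree of what
# remains: its edges give one possible radial recovery configuration.
H.remove_edge("b2", "b3")
tree = nx.minimum_spanning_tree(H)
print(sorted(tree.edges()))
```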

Publication date: 1999